20 research outputs found

    On the probabilistic symbolic analysis of programs

    No full text
    Recently we have proposed symbolic execution techniques for the probabilistic analysis of programs. These techniques seek to quantify the probability that a program satisfies a property of interest under a relevant usage profile. We describe recent advances in probabilistic symbolic analysis, including the handling of complex floating-point constraints and nondeterminism, and the use of statistical techniques for increased scalability.
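
    As a rough illustration of the idea (a minimal sketch, not the authors' Symbolic PathFinder implementation), the code below estimates the probability that a tiny program satisfies a property under a uniform usage profile; exhaustive enumeration of a hypothetical input domain stands in for the symbolic path conditions and model counting used in the paper.

```java
// Minimal sketch: estimate P(property) as
//   (# inputs whose execution satisfies the property) / (# inputs in the domain).
// Enumeration over a tiny hypothetical domain stands in for symbolic execution
// plus model counting over path conditions.
public class PathProbabilitySketch {

    // Example program under analysis: the property of interest holds
    // exactly on the path where x*x + y*y <= 100.
    static boolean program(int x, int y) {
        return x * x + y * y <= 100;
    }

    public static void main(String[] args) {
        int lo = -20, hi = 20;                    // hypothetical usage profile: uniform on [-20, 20]^2
        long satisfying = 0, total = 0;
        for (int x = lo; x <= hi; x++) {
            for (int y = lo; y <= hi; y++) {
                total++;
                if (program(x, y)) satisfying++;  // stands in for the model count of the path condition
            }
        }
        System.out.printf("P(property) ~= %.4f%n", (double) satisfying / total);
    }
}
```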

    Symbolic Quantitative Information Flow

    Get PDF
    acmid: 2382791 issue_date: November 2012 keywords: algorithms, security, verification numpages: 5

    Multi-run side-channel analysis using Symbolic Execution and Max-SMT

    Get PDF

    Model-counting approaches for nonlinear numerical constraints

    Get PDF
    Model counting is of central importance in quantitative reasoning about systems. Examples include computing the probability that a system successfully accomplishes its task without errors, and measuring, in terms of Shannon entropy, the number of bits leaked by a system to an adversary. Most previous work in these areas demonstrates its analysis on programs with linear constraints, for which model counting can be done in polynomial time. Model counting for nonlinear constraints is notoriously hard, and thus programs with nonlinear constraints are not well studied. This paper surveys state-of-the-art techniques and tools for model counting with respect to SMT constraints, modulo the bitvector theory, since this theory is decidable and can express nonlinear constraints that arise from the analysis of computer programs. We integrate these techniques within the Symbolic PathFinder platform and evaluate them on difficult nonlinear constraints generated from the analysis of cryptographic functions.
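
    For intuition (a toy sketch, not one of the surveyed tools), the following counts the models of a hypothetical nonlinear constraint over 8-bit bitvectors by enumeration; the finiteness of the bitvector domain is what makes such counting well defined, while real counters replace these loops with SMT-based or hashing-based techniques.

```java
// Minimal sketch of bitvector model counting by enumeration over a tiny width.
public class BitvectorModelCountSketch {
    public static void main(String[] args) {
        int width = 8;                        // 8-bit bitvectors, so 2^8 values per variable
        long count = 0;
        for (int x = 0; x < (1 << width); x++) {
            for (int y = 0; y < (1 << width); y++) {
                // hypothetical nonlinear constraint: (x * y) mod 2^8 == 42
                if (((x * y) & 0xFF) == 42) count++;
            }
        }
        System.out.println("models of (x*y mod 2^8 == 42): " + count);
    }
}
```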

    Compositional Verification for Autonomous Systems with Deep Learning Components

    Full text link
    As autonomy becomes prevalent in many applications, ranging from recommendation systems to fully autonomous vehicles, there is an increased need to provide safety guarantees for such systems. The problem is difficult, as these are large, complex systems which operate in uncertain environments, requiring data-driven machine-learning components. However, learning techniques such as Deep Neural Networks, widely used today, are inherently unpredictable and lack the theoretical foundations to provide strong assurance guarantees. We present a compositional approach for the scalable, formal verification of autonomous systems that contain Deep Neural Network components. The approach uses assume-guarantee reasoning whereby contracts, encoding the input-output behavior of individual components, allow the designer to model and incorporate the behavior of the learning-enabled components working side by side with the other components. We illustrate the approach on an example taken from the autonomous vehicles domain.
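
    A minimal sketch of what an assume-guarantee contract around a learning-enabled component might look like, assuming a hypothetical perception function and hand-written bounds; the paper uses such contracts for compositional formal verification rather than the runtime checks shown here.

```java
// Illustrative contract: the "assume" clause constrains the component's inputs,
// the "guarantee" clause constrains its outputs, so a downstream controller can
// rely on the guarantee without inspecting the network itself.
import java.util.function.Function;

public class ContractSketch {

    interface Contract<I, O> {
        boolean assume(I input);          // obligation on the environment
        boolean guarantee(I in, O out);   // obligation on the component
    }

    static <I, O> O runUnderContract(Function<I, O> component, Contract<I, O> c, I input) {
        if (!c.assume(input)) throw new IllegalArgumentException("assumption violated");
        O out = component.apply(input);
        if (!c.guarantee(input, out)) throw new IllegalStateException("guarantee violated");
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical perception component: estimates a distance (m) from a sensor reading.
        Function<Double, Double> perception = reading -> 0.5 * reading;
        Contract<Double, Double> contract = new Contract<Double, Double>() {
            public boolean assume(Double reading) { return reading >= 0 && reading <= 200; }
            public boolean guarantee(Double r, Double dist) { return dist >= 0 && dist <= 100; }
        };
        System.out.println("estimated distance: " + runUnderContract(perception, contract, 60.0));
    }
}
```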

    Symbolic Side-Channel Analysis for Probabilistic Programs

    Get PDF
    In this paper we describe symbolic side-channel analysis techniques for detecting and quantifying information leakage, given in terms of Shannon entropy and min-entropy. Measuring the precise leakage is challenging due to the randomness and noise often present in program executions and side-channel observations. We account for this noise by introducing additional (symbolic) program inputs which are interpreted probabilistically, using symbolic execution with parameterized model counting. We also explore an approximate sampling approach for increased scalability. In contrast to typical Monte Carlo techniques, our approach works by sampling symbolic paths, each representing multiple concrete paths, and uses pruning to accelerate computation and guarantee convergence to the optimal results. The key novelty of our approach is to provide bounds on the leakage that provably under- and over-approximate the real leakage. We implemented the techniques in the Symbolic PathFinder tool and demonstrate them on Java programs.
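
    As a worked illustration (assuming a deterministic program, a uniformly distributed secret, and hypothetical model counts, so none of the paper's noise handling), Shannon leakage and min-entropy leakage can be computed directly from the sizes of the secret sets that map to each observable.

```java
// Illustrative leakage computation from model counts. For a deterministic program
// with a uniform secret, each observable o_i corresponds to n_i secrets out of N:
//   Shannon leakage     = -sum_i (n_i/N) * log2(n_i/N)
//   min-entropy leakage = log2(number of observables)
public class LeakageSketch {
    public static void main(String[] args) {
        long[] counts = {8, 4, 4};            // hypothetical counts: 3 observables partition N = 16 secrets
        long total = 0;
        for (long c : counts) total += c;

        double shannon = 0;
        for (long c : counts) {
            double p = (double) c / total;
            shannon += -p * (Math.log(p) / Math.log(2));
        }
        double minEntropy = Math.log(counts.length) / Math.log(2);

        System.out.printf("Shannon leakage:     %.3f bits%n", shannon);     // 1.500 bits
        System.out.printf("Min-entropy leakage: %.3f bits%n", minEntropy);  // ~1.585 bits
    }
}
```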

    Quantifying Information Leaks Using Reliability Analysis

    Get PDF
    acmid: 2632367 keywords: Model Counting, Quantitative Information Flow, Reliability Analysis, Symbolic Execution location: San Jose, CA, USA numpages: 4
    We report on our work in progress on the use of reliability analysis to quantify information leaks. In recent work we have proposed a software reliability analysis technique that uses symbolic execution and model counting to quantify the probability of reaching designated program states, e.g. assertion violations, under uncertainty conditions in the environment. The technique has many applications beyond reliability analysis, ranging from program understanding and debugging to the analysis of cyber-physical systems. In this paper we report on a novel application of the technique, namely Quantitative Information Flow analysis (QIF). The goal of QIF is to measure the information leakage of a program using information-theoretic metrics such as Shannon entropy or Renyi entropy. We exploit the model counting engine of the reliability analyzer over symbolic program paths to compute an upper bound on the maximum leakage over all possible distributions of the confidential data. We have implemented our approach in a prototype tool, called QILURA, and explore its effectiveness on a number of case studies.
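
    A minimal sketch of the channel-capacity bound (not the QILURA implementation): for a deterministic program, the leakage maximized over all distributions of the secret is at most log2 of the number of distinct observables; here that number is obtained by brute-force enumeration of a hypothetical observable function rather than by counting feasible symbolic paths.

```java
// Illustrative capacity bound: max leakage <= log2(k), where k is the number of
// distinct observable outputs the program can produce.
import java.util.HashSet;
import java.util.Set;

public class CapacityBoundSketch {

    // Hypothetical program leaking whether a 6-bit secret exceeds a threshold
    // and whether it is even: at most 4 distinct observables.
    static int observable(int secret) {
        return ((secret > 31) ? 2 : 0) + (secret & 1);
    }

    public static void main(String[] args) {
        Set<Integer> outputs = new HashSet<>();
        for (int secret = 0; secret < 64; secret++) {
            outputs.add(observable(secret));
        }
        double bound = Math.log(outputs.size()) / Math.log(2);
        System.out.printf("max leakage <= log2(%d) = %.2f bits%n", outputs.size(), bound);
    }
}
```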

    Synthesis of Adaptive Side-Channel Attacks

    Get PDF
    We present symbolic analysis techniques for detecting vulnerabilities that are due to adaptive side-channel attacks, and for synthesizing inputs that exploit the identified vulnerabilities. We start with a symbolic attack model that encodes succinctly all the side-channel attacks that an adversary can make. Using symbolic execution over this model, we generate a set of mathematical constraints, where each constraint characterizes the set of secret values that lead to the same sequence of side-channel measurements. We then compute the optimal attack, i.e., the attack that yields maximum leakage over the secret, by solving an optimization problem over the computed constraints. We use information-theoretic concepts such as channel capacity and Shannon entropy to quantify the leakage over multiple runs of the attack, where the measurements over the side channels form the observations that an adversary can use to try to infer the secret. We also propose greedy heuristics that generate the attack by exploring a portion of the symbolic attack model in each step. We implemented the techniques in Symbolic PathFinder and applied them to Java programs encoding web services, string manipulations and cryptographic functions, demonstrating how to synthesize optimal side-channel attacks.
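
    To convey the flavor of an adaptive multi-run attack (a hand-written sketch, not an attack synthesized by the paper's Max-SMT approach), assume a hypothetical side channel that reveals whether the secret is below an attacker-chosen guess; the greedy strategy that maximizes per-step leakage then reduces to binary search over the remaining candidate secrets.

```java
// Illustrative adaptive attack: each observation splits the remaining secret set
// as evenly as possible, extracting roughly one bit of leakage per run.
public class AdaptiveAttackSketch {

    static boolean sideChannel(int secret, int guess) {
        return secret < guess;   // stand-in for a timing/power observation
    }

    public static void main(String[] args) {
        int secret = 42;                  // unknown to the attacker
        int lo = 0, hi = 64;              // attacker's prior: secret in [0, 64)
        int runs = 0;
        while (hi - lo > 1) {
            int guess = (lo + hi) / 2;    // greedy choice: halves the candidate set
            if (sideChannel(secret, guess)) hi = guess; else lo = guess;
            runs++;
        }
        System.out.println("recovered secret " + lo + " in " + runs + " runs");
    }
}
```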

    Model counting for complex data structures

    No full text
    We extend recent approaches for calculating the probability of program behaviors to allow model counting for complex data structures with numeric fields. We use symbolic execution with lazy initialization to compute the input structures leading to the occurrence of a target event, while keeping a symbolic representation of the constraints on the numeric data. Off-the-shelf model counting tools are used to count the solutions of the numerical constraints, and field bounds encoding data structure invariants are used to reduce the search space. The technique is implemented in the Symbolic PathFinder tool and evaluated on several complex data structures. The results show that the technique is much faster than an enumeration-based method that uses the Korat tool, and also highlight the benefits of using the field bounds to speed up the analysis.
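
    As a toy illustration (not the Symbolic PathFinder implementation), the sketch below separates the structural choices explored by lazy initialization from the counting of numeric field valuations, using a hypothetical invariant (a singly linked list of length at most 3 with strictly increasing fields bounded to [0, 9]) and a closed-form count in place of an off-the-shelf model counter.

```java
// Illustrative count of input structures satisfying an invariant: enumerate the
// structural shapes (list lengths), then count the numeric field valuations for
// each shape; here the numeric count is binomial(bound, len), the number of
// strictly increasing tuples drawn from [0, bound-1].
public class StructureCountSketch {

    static long binomial(int n, int k) {
        long r = 1;
        for (int i = 1; i <= k; i++) r = r * (n - k + i) / i;   // exact at every step
        return r;
    }

    public static void main(String[] args) {
        int bound = 10;          // field bound: values in [0, 9]
        long total = 0;
        for (int len = 0; len <= 3; len++) {            // structural shapes, as in lazy initialization
            long numericModels = binomial(bound, len);  // stand-in for a model counter
            System.out.println("length " + len + ": " + numericModels + " field valuations");
            total += numericModels;
        }
        System.out.println("total structures satisfying the invariant: " + total);  // 176
    }
}
```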

    Reliability analysis in Symbolic Pathfinder: a brief summary

    No full text